Search results for "Computational statistics"

Showing 5 of 5 documents

On the stability of some controlled Markov chains and its applications to stochastic approximation with Markovian dynamic

2015

We develop a practical approach to establish the stability, that is, the recurrence in a given set, of a large class of controlled Markov chains. These processes arise in various areas of applied science and encompass important numerical methods. We show in particular how individual Lyapunov functions and associated drift conditions for the parametrized family of Markov transition probabilities and the parameter update can be combined to form Lyapunov functions for the joint process, leading to the proof of the desired stability property. Of particular interest is the fact that the approach applies even in situations where the two components of the process present a time-scale separation, w…
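For orientation, the individual drift conditions the abstract refers to are typically of the following geometric form, stated for each kernel in the parametrized family; the notation below is a generic textbook sketch, not taken from the paper:

```latex
% Sketch of a geometric drift condition for a parametrized family of Markov
% kernels {P_theta}; V_theta, lambda_theta, b_theta and the small set C are
% generic textbook notation, not the paper's own symbols.
P_\theta V_\theta(x) \;=\; \int V_\theta(y)\, P_\theta(x,\mathrm{d}y)
  \;\le\; \lambda_\theta\, V_\theta(x) + b_\theta\, \mathbf{1}_C(x),
\qquad \lambda_\theta < 1,\quad b_\theta < \infty .
```

The contribution described above is to show how such per-parameter Lyapunov functions $V_\theta$ can be combined into a Lyapunov function for the joint chain of state and adapted parameter.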

Subjects: controlled Markov chains; stochastic approximation; adaptive Markov chain Monte Carlo; Lyapunov function; stability; Markov process; computational statistics; applied mathematics; Statistics Theory (math.ST); Methodology (stat.ME); Statistics and Probability; MSC: 60J05, 60J22, 65C05

Adaptive independent sticky MCMC algorithms

2018

In this work, we introduce a novel class of adaptive Monte Carlo methods, called adaptive independent sticky MCMC algorithms, for efficient sampling from a generic target probability density function (pdf). The new class of algorithms employs adaptive non-parametric proposal densities which become closer and closer to the target as the number of iterations increases. The proposal pdf is built using interpolation procedures based on a set of support points which is constructed iteratively from previously drawn samples. The algorithm's efficiency is ensured by a test that controls the evolution of the set of support points. This extra stage controls the computational cost and the converge…
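To make the mechanism concrete, here is a minimal sketch of an adaptive independent Metropolis-Hastings sampler whose proposal is refined from previously proposed points. It illustrates the general idea only; the proposal family, the update rule and all function names are our assumptions, not the authors' algorithm.

```python
# Minimal sketch of an adaptive *independent* Metropolis-Hastings sampler
# whose proposal is refined from past proposals -- illustrating the general
# idea behind adaptive independent "sticky" samplers, not the published
# algorithm. Proposal family, update rule and names are assumptions.
import numpy as np

def log_target(x):
    # Example target: unnormalized log-density of a two-component Gaussian mixture.
    return np.logaddexp(-0.5 * (x + 2.0) ** 2, -0.5 * (x - 2.0) ** 2)

def adaptive_independent_mh(n_iter=5000, update_prob=0.1, seed=0):
    rng = np.random.default_rng(seed)
    support = [-2.0, 0.0, 2.0]   # initial support points for the proposal
    scale = 1.0                  # std deviation of each proposal component

    def sample_proposal():
        centre = support[rng.integers(len(support))]
        return centre + scale * rng.standard_normal()

    def log_proposal(x):
        comps = [-0.5 * ((x - c) / scale) ** 2 for c in support]
        return np.logaddexp.reduce(comps) - np.log(len(support))

    x = 0.0
    chain = np.empty(n_iter)
    for t in range(n_iter):
        y = sample_proposal()
        # Independent MH acceptance ratio: target(y) q(x) / (target(x) q(y)).
        log_alpha = (log_target(y) - log_target(x)
                     + log_proposal(x) - log_proposal(y))
        if np.log(rng.uniform()) < log_alpha:
            x = y
        elif rng.uniform() < update_prob:
            # Simplified "sticky" step: occasionally add a rejected point to
            # the support set so the proposal adapts where it mismatches the
            # target (the published algorithms use a statistical test rather
            # than a fixed probability to control this update).
            support.append(y)
        chain[t] = x
    return chain

samples = adaptive_independent_mh()
print(samples.mean(), samples.std())
```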

Subjects: adaptive Markov chain Monte Carlo (MCMC); adaptive rejection Metropolis sampling (ARMS); Bayesian inference; Gibbs sampling; Metropolis-within-Gibbs; hit-and-run algorithm; Monte Carlo methods; computational statistics; statistical signal processing; Gaussian process; Machine Learning (stat.ML); Computation (stat.CO); Signal Processing; Hardware and Architecture; Electrical and Electronic Engineering; Settore SECS-P/05 - Econometria; Settore SECS-S/01 - Statistica; Settore SECS-S/03 - Statistica Economica. Journal: EURASIP Journal on Advances in Signal Processing

SU-E-T-509: DICOM Test Case Plans for Model-Based Dose Calculations Methods in Brachytherapy

2013

Purpose: The TG‐186 report provides guidance to early adopters of model‐based dose calculation algorithms (MBDCAs) for brachytherapy. A charge of the AAPM‐ESTRO Working Group on MBDCA is to develop well‐defined test case plans, available as references for the software commissioning process to be performed by end‐users. The aim of this work is to develop test case plans for a generic HDR ¹⁹²Ir source alone and in combination with a vaginal cylinder applicator with 180° tungsten‐alloy shielding in a DICOM‐based water phantom. Methods: A DICOM CT dataset was created with a 30 cm diameter water sphere surrounded by air. The voxel size was 1.33×1.33×1.33 mm³ for evaluating absorbed dose rate (c…
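As a rough illustration of the geometry described in the Methods (not the Working Group's actual DICOM test-case files), the following sketch builds the corresponding voxel grid: a 30 cm diameter water sphere surrounded by air at 1.33 mm isotropic resolution, with assumed CT numbers.

```python
# Rough sketch (not the Working Group's actual test-case files) of the kind
# of voxelized geometry the abstract describes: a 30 cm diameter water sphere
# surrounded by air on a 1.33 mm isotropic grid, stored as CT-like numbers.
# The HU values and the amount of air padding are our assumptions.
import numpy as np

voxel_mm = 1.33
radius_mm = 150.0                                   # 30 cm diameter sphere
n = int(np.ceil(2 * radius_mm / voxel_mm)) + 20     # pad the grid with air

# Voxel-centre coordinates relative to the sphere centre (broadcast in 3D).
coords = (np.arange(n) - (n - 1) / 2.0) * voxel_mm
x = coords[:, None, None]
y = coords[None, :, None]
z = coords[None, None, :]
inside = x**2 + y**2 + z**2 <= radius_mm**2

# Approximate CT numbers: water ~ 0 HU, air ~ -1000 HU.
phantom_hu = np.where(inside, 0, -1000).astype(np.int16)
print(phantom_hu.shape, phantom_hu.dtype)
print("water voxels:", int(inside.sum()))
```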

Subjects: brachytherapy; Monte Carlo method; DICOM; imaging phantom; voxel; dosimetry; computational statistics; nuclear medicine; medical physics; General Medicine

A fast and recursive algorithm for clustering large datasets with k-medians

2012

Clustering large samples of high-dimensional data with fast algorithms is an important challenge in computational statistics. Borrowing ideas from MacQueen (1967), who introduced a sequential version of the $k$-means algorithm, a new class of recursive stochastic gradient algorithms designed for the $k$-medians loss criterion is proposed. By their recursive nature, these algorithms are very fast and are well adapted to deal with large samples of data that are allowed to arrive sequentially. It is proved that the stochastic gradient algorithm converges almost surely to the set of stationary points of the underlying loss criterion. Particular attention is paid to the averaged versions, which…
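A minimal sketch of such a recursive $k$-medians update with averaging is given below, assuming a simple Robbins-Monro step size and nearest-centre assignment; the initialization and tuning choices are ours, not the paper's.

```python
# Minimal sketch of an online (recursive) k-medians estimator driven by a
# stochastic gradient step, with Polyak-Ruppert averaging -- in the spirit of
# the algorithm described above, though step sizes, initialization and the
# assignment rule here are our own simplifications.
import numpy as np

def online_k_medians(stream, k, gamma0=1.0, alpha=0.66):
    stream = iter(stream)
    # Initialize the k centres from the first k points of the stream.
    centres = np.array([next(stream) for _ in range(k)], dtype=float)
    averaged = centres.copy()
    counts = np.zeros(k)                 # per-cluster update counters

    for t, z in enumerate(stream, start=1):
        z = np.asarray(z, dtype=float)
        # Assign z to its closest centre (Euclidean distance).
        d = np.linalg.norm(centres - z, axis=1)
        j = int(np.argmin(d))
        counts[j] += 1
        if d[j] > 0:
            # Stochastic gradient of the k-medians loss E||Z - c|| at c_j is
            # the unit vector pointing from z towards c_j, so the descent
            # step moves c_j a little towards z.
            gamma = gamma0 / counts[j] ** alpha
            centres[j] += gamma * (z - centres[j]) / d[j]
        # Polyak-Ruppert averaging of the centre matrix across iterations.
        averaged += (centres - averaged) / t
    return centres, averaged

# Toy usage: three well-separated Gaussian clusters in 2D.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(loc, 0.3, size=(2000, 2))
                       for loc in ([0, 0], [4, 0], [0, 4])])
rng.shuffle(data)
raw, avg = online_k_medians(data, k=3)
print(np.round(avg, 2))
```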

Subjects: clustering high-dimensional data; k-medoids; partitioning around medoids; online clustering; cluster analysis; stochastic approximation; stochastic gradient; Robbins-Monro; averaging; recursive estimators; almost sure convergence; computational statistics; Machine Learning (stat.ML); Computation (stat.CO); Statistics Theory (stat.TH); Statistics and Probability; Applied Mathematics; Computational Mathematics

A Software Tool For Sparse Estimation Of A General Class Of High-dimensional GLMs

2022

Generalized linear models are the workhorse of many inferential problems. Also in the modern era of high-dimensional settings, such models have proven to be effective exploratory tools. Most attention has been paid to Gaussian, binomial and Poisson settings, which have efficient computational implementations and where the dispersion parameter is either largely irrelevant or absent. However, general GLMs have dispersion parameters φ that affect the value of the log-likelihood. This, in turn, affects the value of various information criteria such as AIC and BIC, and has a considerable impact on the computation and selection of the optimal model. The R-package dglars is one of the standa…
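For context on why φ matters for model selection: the exponential-dispersion-family log-likelihood, and hence AIC, depends explicitly on it. In standard textbook notation (not the package's):

```latex
% Exponential-dispersion-family log-likelihood and AIC; generic textbook
% notation, not taken from the dglars documentation.
\ell(\beta, \varphi; y) \;=\; \sum_{i=1}^{n}
  \left\{ \frac{y_i\,\theta_i - b(\theta_i)}{\varphi} + c(y_i, \varphi) \right\},
\qquad
\mathrm{AIC} \;=\; -2\,\ell(\hat\beta, \hat\varphi; y) + 2\,(k + 1),
```

where $k$ counts the regression coefficients and the additional parameter accounts for estimating φ; BIC replaces the penalty $2(k+1)$ with $(k+1)\log n$.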

Subjects: high-dimensional data; dglars; penalized inference; computational statistics; Numerical Analysis; Statistics and Probability; Statistics, Probability and Uncertainty; Settore SECS-S/01 - Statistica